1. Get an axe.
2. Cut some wood.
3. Sharpen the axe.
4. Cut more wood.
5. Note the difference.
It's still hard work, but it's a good deal easier with a sharp tool. What catches you unaware is that the tool goes dull as you use it and needs to be sharpened again. It's a lot like the frog in a pot of water being slowly brought to a boil: you don't notice the incremental loss of sharpness until you put a new edge on it.
But we have power tools!
1. Get a chainsaw.
2. Cut some wood.
3. Sharpen the chainsaw.
4. Cut more wood.
5. Note the difference.
A dull power tool is still inefficient; it just has enough excess power that you may not notice the dullness as easily. When the tool is sharp, more power goes into cutting and less into heat (this is why you usually cut or drill metal with a lubricant).
How does this work in the context of software? The tools don't get duller in an absolute sense, but they can get less sharp relative to advances in theory and hardware (e.g., running single-threaded on a multi-core system). They also go dull against the challenge of new problems; after all, none of us would have jobs if every problem had already been solved.
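To make the single-threaded example concrete, here is a minimal sketch in Go (the workload, the name sumRange, and the constant n are all made up for illustration). It runs the same CPU-bound job twice: once on a single goroutine, and once split across every core that runtime.NumCPU() reports. On an N-core machine, the first version leaves roughly (N-1)/N of the available cutting power on the table.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
	"time"
)

// A deliberately CPU-bound workload (hypothetical, purely for
// illustration): sum the integers in [lo, hi).
func sumRange(lo, hi int64) int64 {
	var total int64
	for i := lo; i < hi; i++ {
		total += i
	}
	return total
}

func main() {
	const n = 2_000_000_000

	// The "dull" version: one goroutine, so one core does all the work.
	start := time.Now()
	serial := sumRange(0, n)
	fmt.Printf("serial:   %d in %v\n", serial, time.Since(start))

	// The "sharpened" version: split the same range across every core.
	workers := runtime.NumCPU()
	chunk := n / int64(workers)
	results := make([]int64, workers)
	var wg sync.WaitGroup

	start = time.Now()
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func(w int) {
			defer wg.Done()
			lo := int64(w) * chunk
			hi := lo + chunk
			if w == workers-1 {
				hi = n // the last worker picks up any remainder
			}
			// Each worker writes only its own slot, so no data race.
			results[w] = sumRange(lo, hi)
		}(w)
	}
	wg.Wait()

	var parallel int64
	for _, r := range results {
		parallel += r
	}
	fmt.Printf("parallel: %d in %v (%d workers)\n",
		parallel, time.Since(start), workers)
}
```

The split only works this cleanly because the chunks are independent; real workloads rarely divide so neatly, which is part of why this kind of dullness goes unnoticed for so long.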
So, have you sharpened your tools lately? Have you stopped to wonder if they are dull?